Sparsely Connected Convolutional Networks
Authors
Abstract
Residual learning [6] with skip connections permits training ultra-deep neural networks and obtains superb performance. Building on this direction, DenseNets [7] proposed a dense connection structure in which each layer is directly connected to all of its predecessors. This densely connected structure leads to better information flow and feature reuse. However, the overly dense skip connections also bring potential risks of overfitting, parameter redundancy, and large memory consumption. In this work, we analyze the feature aggregation patterns of ResNets and DenseNets under a uniform aggregation-view framework. We show that both structures densely gather features from previous layers in the network but combine them in their respective ways: summation (ResNets) or concatenation (DenseNets). We compare the strengths and drawbacks of these two aggregation methods and analyze their potential effects on network performance. Based on our analysis, we propose a new structure named SparseNets, which achieves better performance with fewer parameters than both DenseNets and ResNets.
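The two aggregation rules contrasted in the abstract differ in how feature width evolves with depth. As a rough illustration (not the authors' code), the summation and concatenation rules can be sketched with NumPy; the shapes and layer counts here are arbitrary assumptions for the example:

```python
import numpy as np

# Two "layer outputs" with the same channel dimension: (batch, channels).
f1 = np.random.randn(4, 8)  # features from an earlier layer
f2 = np.random.randn(4, 8)  # features from the current layer

# ResNet-style aggregation: element-wise summation.
# The channel count stays fixed, so width does not grow with depth.
res_agg = f1 + f2                              # shape (4, 8)

# DenseNet-style aggregation: channel-wise concatenation.
# Every preceding layer's features are retained, so width grows
# linearly with the number of aggregated layers.
dense_agg = np.concatenate([f1, f2], axis=1)   # shape (4, 16)

print(res_agg.shape, dense_agg.shape)
```

This shape difference is the root of the trade-off the abstract describes: summation keeps memory flat but overwrites earlier features, while concatenation preserves them explicitly at the cost of growing width and parameter count.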
Similar Resources
Sparsely-Connected Neural Networks: Towards Efficient VLSI Implementation of Deep Neural Networks
Recently deep neural networks have received considerable attention due to their ability to extract and represent high-level abstractions in data sets. Deep neural networks such as fully-connected and convolutional neural networks have shown excellent performance on a wide range of recognition and classification tasks. However, their hardware implementations currently suffer from large silicon a...
Merging and Evolution: Improving Convolutional Neural Networks for Mobile Applications
Compact neural networks are inclined to exploit “sparsely-connected” convolutions such as depthwise convolution and group convolution for employment in mobile applications. Compared with standard “fully-connected” convolutions, these convolutions are more computationally economical. However, “sparsely-connected” convolutions block the inter-group information exchange, which induces severe perfo...
CayleyNets: Graph Convolutional Neural Networks with Complex Rational Spectral Filters
The rise of graph-structured data such as social networks, regulatory networks, citation graphs, and functional brain networks, in combination with resounding success of deep learning in various applications, has brought the interest in generalizing deep learning models to non-Euclidean domains. In this paper, we introduce a new spectral domain convolutional architecture for deep learning on gr...
Noise Injection Into Inputs in Sparsely Connected Hopfield and Winner-Take-All Neural Networks (IEEE Transactions on Systems, Man, and Cybernetics, Part B)
In this paper, we show that noise injection into inputs in unsupervised learning neural networks does not improve their performance as it does in supervised learning neural networks. Specifically, we show that training noise degrades the classification ability of a sparsely connected version of the Hopfield neural network, whereas the performance of a sparsely connected winner-take-all neural n...
Journal: CoRR
Volume: abs/1801.05895
Pages: -
Publication date: 2018